Conjugate residual method
The conjugate residual method is an iterative numerical method for solving systems of linear equations. It is a Krylov subspace method very similar to the much more popular conjugate gradient method, with similar construction and convergence properties.
This method is used to solve linear equations of the form
:\mathbf{A}\mathbf{x} = \mathbf{b}
where \mathbf{A} is an invertible Hermitian matrix and \mathbf{b} is nonzero.
The conjugate residual method differs from the closely related conjugate gradient method primarily in that it involves slightly more numerical operations and requires slightly more storage; in exchange, the system matrix is only required to be Hermitian, not symmetric positive definite.
Given an (arbitrary) initial estimate of the solution \mathbf x_0, the method is outlined below:
:
\begin{align}
& \mathbf{x}_0 := \text{some initial guess} \\
& \mathbf{r}_0 := \mathbf{b} - \mathbf{A}\mathbf{x}_0 \\
& \mathbf{p}_0 := \mathbf{r}_0 \\
& \text{iterate, with } k \text{ starting at } 0: \\
& \qquad \alpha_k := \frac{\mathbf{r}_k^\mathrm{T} \mathbf{A}\mathbf{r}_k}{(\mathbf{A}\mathbf{p}_k)^\mathrm{T} \mathbf{A}\mathbf{p}_k} \\
& \qquad \mathbf{x}_{k+1} := \mathbf{x}_k + \alpha_k \mathbf{p}_k \\
& \qquad \mathbf{r}_{k+1} := \mathbf{r}_k - \alpha_k \mathbf{A}\mathbf{p}_k \\
& \qquad \beta_k := \frac{\mathbf{r}_{k+1}^\mathrm{T} \mathbf{A}\mathbf{r}_{k+1}}{\mathbf{r}_k^\mathrm{T} \mathbf{A}\mathbf{r}_k} \\
& \qquad \mathbf{p}_{k+1} := \mathbf{r}_{k+1} + \beta_k \mathbf{p}_k \\
& \qquad \mathbf{A}\mathbf{p}_{k+1} := \mathbf{A}\mathbf{r}_{k+1} + \beta_k \mathbf{A}\mathbf{p}_k \\
& \qquad k := k + 1
\end{align}

The iteration may be stopped once \mathbf{x}_k has been deemed converged. The only difference between this and the conjugate gradient method is the calculation of \alpha_k and \beta_k (plus the optional incremental update of \mathbf{A}\mathbf{p}_k at the end of each iteration).
Note: the above algorithm can be transformed so that it performs only one symmetric matrix-vector multiplication in each iteration.
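As a concrete illustration, the iteration above can be sketched in NumPy for the real symmetric case. The function name `conjugate_residual`, the residual-norm stopping test, and the absence of breakdown safeguards (the case \mathbf{r}_k^\mathrm{T}\mathbf{A}\mathbf{r}_k = 0) are choices made here, not part of the algorithm statement:

```python
import numpy as np

def conjugate_residual(A, b, x0=None, tol=1e-8, max_iter=1000):
    """Conjugate residual iteration for A x = b, with A Hermitian
    (real symmetric here), not necessarily positive definite.
    Sketch only: no safeguard against breakdown (r^T A r = 0)."""
    x = np.zeros_like(b, dtype=float) if x0 is None else np.asarray(x0, dtype=float).copy()
    r = b - A @ x            # r_0 := b - A x_0
    p = r.copy()             # p_0 := r_0
    Ar = A @ r
    Ap = Ar.copy()           # A p_0 = A r_0
    rAr = r @ Ar             # r_k^T A r_k, carried between iterations
    for _ in range(max_iter):
        alpha = rAr / (Ap @ Ap)     # alpha_k
        x += alpha * p              # x_{k+1}
        r -= alpha * Ap             # r_{k+1}
        if np.linalg.norm(r) < tol:
            break
        Ar = A @ r                  # the one matrix-vector product per iteration
        rAr_next = r @ Ar
        beta = rAr_next / rAr       # beta_k
        rAr = rAr_next
        p = r + beta * p            # p_{k+1}
        Ap = Ar + beta * Ap         # A p_{k+1}, updated without another product
    return x
```

Because \mathbf{A} only needs to be Hermitian, this sketch also handles indefinite systems (for example a diagonal matrix with entries of mixed sign), which the conjugate gradient method is not guaranteed to solve.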
==Preconditioning==

By making a few substitutions and variable changes, a preconditioned conjugate residual method may be derived in the same way as done for the conjugate gradient method:
:
\begin{align}
& \mathbf{x}_0 := \text{some initial guess} \\
& \mathbf{r}_0 := \mathbf{M}^{-1}(\mathbf{b} - \mathbf{A}\mathbf{x}_0) \\
& \mathbf{p}_0 := \mathbf{r}_0 \\
& \text{iterate, with } k \text{ starting at } 0: \\
& \qquad \alpha_k := \frac{\mathbf{r}_k^\mathrm{T} \mathbf{A}\mathbf{r}_k}{(\mathbf{A}\mathbf{p}_k)^\mathrm{T} \mathbf{M}^{-1} \mathbf{A}\mathbf{p}_k} \\
& \qquad \mathbf{x}_{k+1} := \mathbf{x}_k + \alpha_k \mathbf{p}_k \\
& \qquad \mathbf{r}_{k+1} := \mathbf{r}_k - \alpha_k \mathbf{M}^{-1} \mathbf{A}\mathbf{p}_k \\
& \qquad \beta_k := \frac{\mathbf{r}_{k+1}^\mathrm{T} \mathbf{A}\mathbf{r}_{k+1}}{\mathbf{r}_k^\mathrm{T} \mathbf{A}\mathbf{r}_k} \\
& \qquad \mathbf{p}_{k+1} := \mathbf{r}_{k+1} + \beta_k \mathbf{p}_k \\
& \qquad \mathbf{A}\mathbf{p}_{k+1} := \mathbf{A}\mathbf{r}_{k+1} + \beta_k \mathbf{A}\mathbf{p}_k \\
& \qquad k := k + 1
\end{align}

The preconditioner \mathbf{M}^{-1} must be symmetric. Note that the residual vector here is different from the residual vector without preconditioning.
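The preconditioned iteration can be sketched in the same style. Here `M_inv`, a callable that applies \mathbf{M}^{-1} to a vector, and the choice to stop on the norm of the preconditioned residual are illustrative assumptions, not part of the source description:

```python
import numpy as np

def preconditioned_cr(A, b, M_inv, x0=None, tol=1e-8, max_iter=1000):
    """Preconditioned conjugate residual sketch. M_inv is a callable
    applying the symmetric preconditioner M^{-1} to a vector; r is
    the *preconditioned* residual M^{-1}(b - A x), as noted in the text."""
    x = np.zeros_like(b, dtype=float) if x0 is None else np.asarray(x0, dtype=float).copy()
    r = M_inv(b - A @ x)     # r_0 := M^{-1}(b - A x_0)
    p = r.copy()             # p_0 := r_0
    Ar = A @ r
    Ap = Ar.copy()
    rAr = r @ Ar             # r_k^T A r_k
    for _ in range(max_iter):
        MAp = M_inv(Ap)                 # M^{-1} A p_k, reused twice below
        alpha = rAr / (Ap @ MAp)        # alpha_k
        x += alpha * p
        r -= alpha * MAp                # update of the preconditioned residual
        if np.linalg.norm(r) < tol:     # stop on the preconditioned residual
            break
        Ar = A @ r
        rAr_next = r @ Ar
        beta = rAr_next / rAr           # beta_k
        rAr = rAr_next
        p = r + beta * p
        Ap = Ar + beta * Ap
    return x
```

As a usage example, a Jacobi (diagonal) preconditioner for a matrix with nonzero diagonal could be passed as `M_inv = lambda v: v / np.diag(A)`.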

Excerpt source: the free encyclopedia Wikipedia.
Read the full article on "Conjugate residual method" at Wikipedia.




Copyright(C) kotoba.ne.jp 1997-2016. All Rights Reserved.